
Statistical validity

Characteristic Name: Statistical validity
Dimension: Validity
Description: Computed data must be statistically valid
Granularity: Information object
Implementation Type: Process-based approach
Characteristic Type: Usage

Verification Metric:

The number of tasks failed or underperformed due to lack of statistical validity in data
The number of complaints received due to lack of statistical validity of data


The implementation guidelines are guidelines to follow with regard to the characteristic. The scenarios are examples of their implementation.

Guidelines: Scenario:
Establish the population of interest unambiguously with appropriate justification (maintain documentation) (1) Both credit customers and cash customers are considered for a survey on customer satisfaction.
Establish an appropriate sampling method with appropriate justification (1) Stratified sampling is used to investigate drug preferences of medical officers
Establish statistical validity of samples - avoid over-coverage and under-coverage (maintain documentation) (1) Samples are taken from all income levels in a survey on vaccination
Maintain consistency of samples in case longitudinal analysis is performed. (Maintain documentation) (1) The same population is used over time to collect epidemic data for a longitudinal analysis
Ensure that valid statistical methods are used to enable valid inferences about data, valid comparisons of parameters and generalisation of the findings. (1) A Poisson distribution is used to make inferences since the data-generating events occur within a fixed interval of time and/or space
Ensure that the acceptable variations for estimated parameters are established with appropriate justifications (1) A 95% confidence interval is used in estimating the mean value (as sketched below)
Ensure that appropriate imputation measures are taken to nullify the impact of problems relating to outliers, data collection and data collection procedures, and that the edit rules are defined and maintained. (1) Incomplete responses are removed from the final data sample
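
The sketch below is an illustration only and is not part of the source guidelines: it shows, under assumed data, how two of the checks above (a stratified sample and a 95% confidence interval for an estimated mean) might be scripted in Python. The customer records, field names, and satisfaction scores are hypothetical.

import math
import random
import statistics

def stratified_sample(records, stratum_key, fraction, seed=42):
    """Draw the same fraction from every stratum to avoid under-coverage."""
    rng = random.Random(seed)
    strata = {}
    for rec in records:
        strata.setdefault(rec[stratum_key], []).append(rec)
    sample = []
    for members in strata.values():
        k = max(1, round(len(members) * fraction))
        sample.extend(rng.sample(members, k))
    return sample

def mean_with_ci(values, z=1.96):
    """Point estimate of the mean with a normal-approximation 95% confidence interval."""
    mean = statistics.mean(values)
    half_width = z * statistics.stdev(values) / math.sqrt(len(values))
    return mean, (mean - half_width, mean + half_width)

# Hypothetical usage: customer satisfaction scores stratified by customer type,
# so that both credit and cash customers are represented in the sample.
customers = [{"type": t, "score": random.Random(i).uniform(1, 5)}
             for i, t in enumerate(["credit", "cash"] * 50)]
sample = stratified_sample(customers, "type", fraction=0.2)
mean, ci = mean_with_ci([c["score"] for c in sample])
print(f"estimated mean satisfaction: {mean:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")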

Validation Metric:

How mature is the process to maintain statistical validity of data

These are examples of how the characteristic might occur in a database.

Example: Source:
If a column should contain at least one occurrence of all 50 states, but the column contains only 43 states, then the population is incomplete (a check of this kind is sketched below). Y. Lee, et al., “Journey to Data Quality”, Massachusetts Institute of Technology, 2006.
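
As an illustration of the check described in this example, and not taken from the cited source, the sketch below tests whether a column covers all 50 US states; the customers table and state column are hypothetical.

import sqlite3

# The 50 two-letter US state codes the column is expected to cover.
ALL_STATES = {
    "AL", "AK", "AZ", "AR", "CA", "CO", "CT", "DE", "FL", "GA",
    "HI", "ID", "IL", "IN", "IA", "KS", "KY", "LA", "ME", "MD",
    "MA", "MI", "MN", "MS", "MO", "MT", "NE", "NV", "NH", "NJ",
    "NM", "NY", "NC", "ND", "OH", "OK", "OR", "PA", "RI", "SC",
    "SD", "TN", "TX", "UT", "VT", "VA", "WA", "WV", "WI", "WY",
}

def missing_states(conn):
    """Return the states that never occur in the customers.state column."""
    rows = conn.execute("SELECT DISTINCT state FROM customers").fetchall()
    present = {state for (state,) in rows}
    return sorted(ALL_STATES - present)

# Hypothetical usage: a table covering only three states is an incomplete population.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, state TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [("A", "TX"), ("B", "NY"), ("C", "CA")])
gaps = missing_states(conn)
print(f"population incomplete: {len(gaps)} states missing" if gaps else "population complete")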

The Definitions are examples of the characteristic that appear in the sources provided.

Definition: Source:
Coherence of data refers to the internal consistency of the data. Coherence can be evaluated by determining if there is coherence between different data items for the same point in time, coherence between the same data items for different points in time or coherence between organisations or internationally. Coherence is promoted through the use of standard data concepts, classifications and target populations. HIQA 2011. International Review of Data Quality Health Information and Quality Authority (HIQA), Ireland. http://www.hiqa.ie/press-release/2011-04-28-international-review-data-quality.
1) Accuracy in the general statistical sense denotes the closeness of computations or estimates to the exact or true values.

2) Coherence of statistics is their adequacy to be reliably combined in different ways and for various uses.

LYON, M. 2008. Assessing Data Quality, Monetary and Financial Statistics. Bank of England. http://www.bankofengland.co.uk/statistics/Documents/ms/articles/art1mar08.pdf.

 

Accuracy to reality

Characteristic Name: Accuracy to reality
Dimension: Accuracy
Description: Data should truly reflect the real world
Granularity: Record
Implementation Type: Process-based approach
Characteristic Type: Usage

Verification Metric:

The number of tasks failed or underperformed due to lack of accuracy to reality
The number of complaints received due to lack of accuracy to reality


The implementation guidelines are guidelines to follow with regard to the characteristic. The scenarios are examples of their implementation.

Guidelines: Scenario:
Continuously evaluate whether the existing data model is sufficient to represent the real world as required by the organisational need, and make the necessary amendments to the data model if needed. (1) A student who received a concession travel card is not eligible for a concession fare if he terminates his candidature before completion of the course. Hence the data model should have an extra attribute, "current status of candidature"
Perform regular audits on mission-critical data to verify that every record has a meaningful existence in reality that is useful for the organisation (a minimal audit of this kind is sketched after this table) (1) All customers existing in the customer master file are actually customers in the customer space open to the organisation (non-customers are not in the customer file) (2) "Greg Glass" is recorded as a glass-work company but in fact they are opticians (3) A person's personal details taken from his educational profile may not correctly reflect reality for his insurance profile, even though the information is accurate in its original context
Perform regular audits on mission-critical data to verify that every record has a unique existence in reality (1) It is difficult to determine whether the professor "Andrew" is from Columbia University or from the University of Queensland
Ensure that information available in the system is accurate in the context of a particular activity or event (1) The driver details taken from the vehicle registration may not be accurate when identifying the real person who drove the vehicle when an accident occurred
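
The following sketch is an illustrative assumption and is not taken from the source: it shows one shape such a record-level audit might take, flagging records that have no counterpart in reality or whose recorded values contradict it. The master-file and reference data are hypothetical.

def audit_against_reference(records, reference_by_id):
    """Flag records with no real-world counterpart or with mismatched values."""
    issues = []
    for rec in records:
        ref = reference_by_id.get(rec["id"])
        if ref is None:
            issues.append((rec["id"], "no counterpart in reality"))
        elif ref["business_type"] != rec["business_type"]:
            issues.append((rec["id"],
                           f"recorded as {rec['business_type']}, actually {ref['business_type']}"))
    return issues

# Hypothetical usage: "Greg Glass" is recorded as a glass-work company,
# but the reference data shows the business is actually an optician.
master_file = [{"id": 1, "name": "Greg Glass", "business_type": "glass work"},
               {"id": 2, "name": "Acme Pty", "business_type": "retail"}]
reference = {1: {"name": "Greg Glass", "business_type": "optician"}}
for record_id, problem in audit_against_reference(master_file, reference):
    print(record_id, problem)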

Validation Metric:

How mature is the process to ensure accuracy to reality

These are examples of how the characteristic might occur in a database.

Example: Source:
If the name of a person is John, the value v = John is correct, while the value v = Jhn is incorrect. C. Batini and M. Scannapieco, “Data Quality: Concepts, Methodologies, and Techniques”, Springer, 2006.
Percent of values that are correct when compared to the actual value. For example, M=Male when the subject is Male (a computation of this measure is sketched after these examples). P. Cykana, A. Paul, and M. Stern, “DoD Guidelines on Data Quality Management” in MIT Conference on Information Quality - IQ, 1996, pp. 154-171.
Consider an EMPLOYEE entity (identified by the Employee-Number 314159) and the attribute Year-of-Birth. If the value of Year-of-Birth for employee 314159 is the year the employee was born, the datum is correct. C. Fox, A. Levitin, and T. Redman, “The Notion of Data and Its Quality Dimensions” in Information Processing and Management: an International Journal, Volume 30, Issue 1, Jan-Feb 1994, pp. 9-19.
Consider a database that contains names, addresses, phone numbers, and e-mail addresses of physicians in the state of Texas. This database is known to have a number of errors: some records are wrong, some are missing, and some are obsolete. If you compare the database to the true population of physicians, it is expected to be 85% accurate. If this database is to be used for the state of Texas to notify physicians of a new law regarding assisted suicide, it would certainly be considered poor quality. In fact, it would be dangerous to use it for that intended purpose. If this database were to be used by a new surgical device manufacturer to find potential customers, it would be considered high quality. Any such firm would be delighted to have a potential customer database that is 85% accurate. From it, they could conduct a telemarketing campaign to identify real sales leads with a completely acceptable success rate. The same database: for one use it has poor data quality, and for another it has high data quality. J. E. Olson, “Data Quality: The Accuracy Dimension”, Morgan Kaufmann Publishers, 9 January 2003.
The patient’s identification details are correct and uniquely identify the patient. P. J. Watson, “Improving Data Quality: A Guide for Developing Countries”, World Health Organization, 2003.
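
As an illustration only, not from the cited sources, the sketch below computes the "percent of values that are correct when compared to the actual value" measure quoted above; the records, field names, and reference values are hypothetical.

def value_accuracy(recorded, actual, field):
    """Share of records whose recorded field value matches the actual (reference) value."""
    pairs = [(rec[field], act[field]) for rec, act in zip(recorded, actual)]
    correct = sum(1 for recorded_value, actual_value in pairs if recorded_value == actual_value)
    return correct / len(pairs) if pairs else 0.0

# Hypothetical usage: one misspelled name ("Jhn") lowers name accuracy to 50%.
recorded = [{"name": "John", "sex": "M"}, {"name": "Jhn", "sex": "M"}]
actual   = [{"name": "John", "sex": "M"}, {"name": "John", "sex": "M"}]
print(f"name accuracy: {value_accuracy(recorded, actual, 'name'):.0%}")
print(f"sex accuracy:  {value_accuracy(recorded, actual, 'sex'):.0%}")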

The Definitions are examples of the characteristic that appear in the sources provided.

Definition: Source:
Determines the extent to which data objects correctly represent the real-world values for which they were designed. For example, the sales orders for the Northeast region must be assigned a Northeast sales representative. D. McGilvray, “Executing Data Quality Projects: Ten Steps to Quality Data and Trusted Information”, Morgan Kaufmann Publishers, 2008.
The data value correctly reflects the real-world condition. B. BYRNE, J. K., D. MCCARTY, G. SAUTER, H. SMITH, P. WORCESTER 2008. The information perspective of SOA design, Part 6: The value of applying the data quality analysis pattern in SOA. IBM Corporation.
The data correctly reflects the Characteristics of a Real-World Object or Event being described. Accuracy and Precision represent the highest degree of inherent Information Quality possible. ENGLISH, L. P. 2009. Information quality applied: Best practices for improving business information, processes and systems, Wiley Publishing.
Is the information precise enough and close enough to reality? EPPLER, M. J. 2006. Managing information quality: increasing the value of information in knowledge-intensive products and processes, Springer.
1) Each identifiable data unit maps to the correct real-world phenomenon.

2) Non-identifying (i.e. non-key) attribute values in an identifiable data unit match the property values for the represented real-world phenomenon.

3) Each identifiable data unit represents at least one specific real-world phenomenon.

4) Each identifiable data unit represents at most one specific real-world phenomenon.

PRICE, R. J. & SHANKS, G. 2005. Empirical refinement of a semiotic information quality framework. Proceedings of the 38th Annual Hawaii International Conference on System Sciences (HICSS'05), IEEE, 216a-216a.
1) The degree to which an information object correctly represents another information object, process, or phenomenon in the context of a particular activity or culture.

2) Closeness of agreement between a property value and the true value (the value that characterizes a characteristic perfectly defined under the conditions that exist when the characteristic is considered).

3) The extent to which the correctness of information is verifiable or provable in the context of a particular activity.

STVILIA, B., GASSER, L., TWIDALE, M. B. & SMITH, L. C. 2007. A framework for information quality assessment. Journal of the American Society for Information Science and Technology, 58, 1720-1733.